A 13-billion-parameter pre-trained model based on the LLaMA architecture, capable of translation, programming, text classification, information extraction, summarization, copywriting, commonsense Q&A, and mathematical calculation
Large Language Model
Transformers
Supports Multiple Languages